Storage Capacity Diverges with Synaptic Efficiency in an Associative Memory Model with Synaptic Delay and Pruning
It is known that the storage capacity per synapse increases with synaptic
pruning in a correlation-type associative memory model. However, the
storage capacity of the entire network then decreases. To overcome this
difficulty, we propose decreasing the connecting rate while keeping the total
number of synapses constant by introducing delayed synapses. In this paper, a
discrete synchronous-type model with both delayed synapses and their pruning
is discussed as a concrete example of this proposal. First, we explain the
Yanai-Kim theory by employing statistical neurodynamics. This theory
involves macrodynamical equations for the dynamics of a network with serial
delay elements. Next, exploiting the translational symmetry of these
equations, we re-derive the macroscopic steady-state equations of the model by
using the discrete Fourier transformation. The storage capacities are analyzed
quantitatively. Furthermore, two types of synaptic pruning are treated
analytically: random pruning and systematic pruning. As a result, it becomes
clear that under both types of pruning, the storage capacity increases as the
length of delay increases and the connecting rate of the synapses decreases
when the total number of synapses is kept constant. Moreover, an interesting
fact becomes clear: under random pruning the storage capacity asymptotically
approaches a finite value, whereas under systematic pruning it diverges in
proportion to the logarithm of the length of delay. These results
theoretically support the significance of pruning following an overgrowth of
synapses in the brain, and they strongly suggest that the brain prefers to
store dynamic attractors, such as sequences and limit cycles, rather than
equilibrium states.

Comment: 27 pages, 14 figures
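As a rough, self-contained illustration of the kind of model analyzed in this abstract, the Python sketch below simulates a discrete synchronous correlation-type network that stores P patterns as one cyclic sequence through L serial delay steps, with random pruning at connecting rate c. All names and parameter values (N, L, P, c) are illustrative assumptions; the paper's results come from statistical neurodynamics, not from direct simulation of this kind.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 500   # number of neurons
L = 3     # number of serial delay elements (L = 1 is the standard model)
P = 30    # stored patterns, treated as one cyclic sequence
c = 0.5   # connecting rate after random pruning

# Random +/-1 patterns; pattern mu should recall pattern mu+1 (mod P).
xi = rng.choice([-1.0, 1.0], size=(P, N))

# Correlation-type (Hebbian) weights, one matrix per delay step: a synapse
# with delay l maps the state from l steps ago toward the next pattern.
J = np.zeros((L, N, N))
for l in range(L):
    for mu in range(P):
        J[l] += np.outer(xi[(mu + 1) % P], xi[(mu - l) % P])
J /= N * L

# Random pruning: keep each synapse independently with probability c and
# rescale by 1/c so that the mean synaptic strength is preserved.
mask = rng.random((L, N, N)) < c
J = np.where(mask, J / c, 0.0)

# Discrete synchronous dynamics with delayed inputs:
# s(t+1) = sgn( sum_l J_l s(t-l) ).
history = [xi[(-l) % P].copy() for l in range(L)]  # s(t), s(t-1), ...
for t in range(10):
    h = sum(J[l] @ history[l] for l in range(L))
    s = np.where(h >= 0.0, 1.0, -1.0)
    history = [s] + history[:-1]
    overlap = s @ xi[(t + 1) % P] / N  # overlap with the expected pattern
    print(f"t={t:2d}  overlap={overlap:+.3f}")
```

At this low loading the overlap stays near 1, i.e., the network retrieves the stored sequence; raising P toward capacity makes retrieval fail, which is the quantity the paper computes analytically as a function of L and c.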
Mean Field Analysis of Stochastic Neural Network Models with Synaptic Depression
We investigated the effects of synaptic depression on the macroscopic
behavior of stochastic neural networks. Dynamical mean field equations were
derived for such networks by taking the average of two stochastic variables: a
firing state variable and a synaptic variable. In these equations, the average
of their product is decoupled into the product of their averages, because the
two stochastic variables are independent. We proved the independence of these
two stochastic variables under the assumption that the synaptic weight is of the order of 1/N
with respect to the number of neurons N. Using these equations, we derived
macroscopic steady state equations for a network with uniform connections and a
ring attractor network with Mexican hat type connectivity and investigated the
stability of the steady state solutions. An oscillatory uniform state was
observed in the network with uniform connections due to a Hopf instability.
With the ring network, high-frequency perturbations were shown not to affect
system stability. Two mechanisms destabilize the inhomogeneous steady state,
leading to two oscillatory states: a Turing instability leads to a rotating
bump state, while a Hopf instability leads to an oscillatory bump state, which
was previously unreported. Various oscillatory states arise in a network with
synaptic depression, depending on the strength of the interneuron connections.

Comment: 26 pages, 13 figures. Preliminary results for the present work have
been published elsewhere (Y. Igarashi et al., 2009,
http://www.iop.org/EJ/abstract/1742-6596/197/1/012018).
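To make the decoupling step concrete, here is a minimal Python sketch (not the authors' code; the network, the Tsodyks-Markram-style depression rule, and every parameter value are assumptions for illustration). It simulates N stochastic binary neurons with uniform O(1/N) connections and per-neuron depression variables x_i, and compares the population activity with the two dynamical mean-field equations obtained by replacing the average of the product x_i s_i with the product of the averages.

```python
import numpy as np

rng = np.random.default_rng(1)

N, T = 5000, 200
J, theta, beta = 2.0, 0.2, 10.0  # uniform coupling, threshold, noise level
U, tau = 0.5, 8.0                # depression: release fraction, recovery time

def firing_prob(u):
    return 1.0 / (1.0 + np.exp(-beta * u))

# --- Microscopic stochastic simulation ---------------------------------
s = (rng.random(N) < 0.5).astype(float)  # firing states s_i in {0, 1}
x = np.ones(N)                           # synaptic depression variables x_i
m_micro = []
for _ in range(T):
    drive = J * np.mean(x * s) - theta        # uniform O(1/N) coupling
    s = (rng.random(N) < firing_prob(drive)).astype(float)
    x = x + (1.0 - x) / tau - U * x * s       # recovery minus resource use
    m_micro.append(s.mean())

# --- Dynamical mean-field equations -------------------------------------
# Decoupling: <x_i s_i> is replaced by <x><s>, justified when the synaptic
# weights are of order 1/N, so x_i and s_i become independent.
m, X = 0.5, 1.0
m_mf = []
for _ in range(T):
    m = firing_prob(J * X * m - theta)
    X = X + (1.0 - X) / tau - U * X * m
    m_mf.append(m)

print("last 3 steps (microscopic vs mean field):")
for a, b in zip(m_micro[-3:], m_mf[-3:]):
    print(f"  m_micro = {a:.3f}   m_mf = {b:.3f}")
```

With these illustrative parameters both trajectories settle to the same steady activity up to O(1/sqrt(N)) fluctuations; for stronger coupling the uniform state can instead oscillate, echoing the Hopf instability discussed in the abstract.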